# Multi-domain Pretraining
**TinyLlama V1.1 Chinese** (TinyLlama) · Apache-2.0 · Large Language Model · Transformers · English

TinyLlama is a 1.1-billion-parameter small language model that adopts the same architecture and tokenizer as Llama 2, making it suitable for resource-constrained application scenarios.
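A minimal usage sketch with the Transformers library is shown below; the repository id is an assumption and should be checked against the actual model card.

```python
# Minimal sketch: loading a TinyLlama-style causal LM with Transformers.
# NOTE: the repository id below is assumed, not confirmed by the listing.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama_v1.1_Chinese"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("北京是中国的", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```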
**ruGPT-3.5 13B** (ai-forever) · MIT · Large Language Model · Transformers · Multilingual

A 13-billion-parameter language model for Russian, pretrained on 300 GB of multi-domain data and reaching a Russian-language perplexity of around 8.8.
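Because a 13-billion-parameter model does not fit in full precision on a single consumer GPU, the loading sketch below uses half precision; the repository id and memory strategy are assumptions, not part of the listing.

```python
# Rough sketch: generating Russian text with a 13B GPT-style checkpoint.
# NOTE: the repository id is assumed; device_map="auto" requires the
# accelerate package, and float16 still needs roughly 26 GB of memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai-forever/ruGPT-3.5-13B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

inputs = tokenizer("Столица России", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```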
**DistilBERT MLM Best** (vocab-transformers) · Large Language Model · Transformers

DistilBERT is a lightweight distilled version of BERT, retaining 97% of BERT's performance while being 40% smaller and 60% faster.
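A masked-language-model checkpoint like this is typically exercised through the fill-mask pipeline; a short sketch follows, with the repository id assumed from the listing.

```python
# Minimal sketch: masked-token prediction with a DistilBERT MLM checkpoint.
# NOTE: the repository id below is assumed and may differ on the hub.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="vocab-transformers/distilbert-mlm-best")
for candidate in fill_mask("The capital of France is [MASK]."):
    print(candidate["token_str"], round(candidate["score"], 3))
```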
**ProcBERT** (fbaigt) · Large Language Model · Transformers · English

ProcBERT is a pre-trained language model optimized for procedural text. It was pre-trained on a large-scale corpus of process texts, including biomedical literature, chemical patents, and cooking recipes, and shows strong performance on downstream tasks.
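For downstream use, an encoder like ProcBERT is usually loaded with a task head and fine-tuned; the sketch below adds a token-classification head over procedural text, with the repository id and label count as placeholders rather than values from the listing.

```python
# Rough sketch: ProcBERT as an encoder with a freshly initialized
# token-classification head (to be fine-tuned on labeled procedural text).
# NOTE: the repository id and num_labels are assumed placeholders.
from transformers import AutoModelForTokenClassification, AutoTokenizer

model_id = "fbaigt/procbert"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id, num_labels=5)

text = "Centrifuge the sample at 3000 rpm for 10 minutes."
inputs = tokenizer(text, return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, sequence_length, num_labels)
print(logits.shape)
```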